Security, Video Intelligence, and Metadata Integration

From RidgeRun Developer Wiki





What is Security, Video Intelligence & Metadata Integration?

In many video systems, it’s not just the visuals that matter—extra information (called metadata) can be added to the video stream to make it smarter. This metadata might include GPS location, timestamps, sensor readings, object IDs, or tracking data. It’s embedded into the video so it can travel along with the footage and be used for real-time analysis or post-event review.

This kind of integration is especially useful in surveillance, defense, and law enforcement, where knowing the who, what, and where behind a video frame is critical.

Real Use Case Scenario

A border patrol agency uses aerial drones to monitor a wide area. The drones record high-resolution video while flying and also collect metadata such as the drone’s GPS position, camera angle, and detection zones. Instead of sending this data separately, the system embeds it into the video stream itself.

When the footage is reviewed later, analysts can not only watch what happened but also know exactly where and when each event took place. This makes it easier to track movement across zones, verify incidents, and respond more effectively. The embedded metadata also enables automatic systems to highlight areas of interest—like detecting vehicles or people entering restricted zones.

Drone Metadata Analysis

How can RDS help you build your Security, Video Intelligence & Metadata Integration System?

The RidgeRun Development Suite (RDS) enables advanced metadata workflows through a variety of modules that handle embedded (in-band) metadata, including formats like SEI, OBU, and MISB-compliant KLV streams. These modules allow video streams to carry essential data such as timestamps, coordinates, object tracking, or sensor readings—crucial for mission-critical systems.

This functionality is powered by a set of integrated RidgeRun plugins, including:

  • GstSEIMetadata – for inserting arbitrary metadata directly into an H264/H265 encoded video.
  • In-band metadata – for inserting arbitrary metadata into an MPEG Transport Stream.
  • GstObuMetadata – for inserting arbitrary metadata into an AV1 encoded video.
  • LibMISB – for encoding/decoding metadata according to the MISB standard.
  • GstRtspSink – for streaming via RTSP.
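To give an intuition for what "in-band" insertion means, the sketch below builds the byte layout of an H.264 SEI "user data unregistered" NAL unit (NAL type 6, payload type 5): a 16-byte UUID followed by arbitrary payload bytes, wrapped with the ff-chunked size coding and RBSP stop bit the standard defines. This is only an illustration of the container format; the GstSEIMetadata elements handle this (including emulation-prevention bytes, omitted here) for you.

```python
def build_sei_user_data(uuid16: bytes, payload: bytes) -> bytes:
    """Sketch of an H.264 SEI user_data_unregistered NAL unit
    (emulation-prevention bytes omitted for brevity)."""
    assert len(uuid16) == 16
    body = uuid16 + payload
    size = len(body)
    size_bytes = b""
    while size >= 255:            # payload size is coded in 0xFF chunks
        size_bytes += b"\xff"
        size -= 255
    size_bytes += bytes([size])
    # 0x06 = NAL type 6 (SEI); 0x05 = user_data_unregistered payload type;
    # trailing 0x80 = RBSP stop bit
    nal = bytes([0x06, 0x05]) + size_bytes + body + b"\x80"
    return b"\x00\x00\x00\x01" + nal   # Annex-B start code


sei = build_sei_user_data(b"\x00" * 16, b"Hello World")
print(sei.hex())
```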

See RDS in action for Security, Video Intelligence and Metadata Integration

The easiest way to see our products in action is by running the included demo applications. The Metadata Demo application shows how RDS can help you build a Security, Video Intelligence and Metadata Integration system, letting you select between the following metadata options developed by RidgeRun: SEI, OBU, In-band metadata, and MISB-compliant KLV streams. To run the demo application, follow these steps:

1. Start RR-Media demo application

rr-media

2. Select Metadata Demo from the application menu

Available Plugins
3. Metadata Demo
Select plugin [0/1/2/3/4/5/6/7]: 3

3. Start the demo by selecting Run

▶ Metadata Demo
┌──────┬──────────────────────────────┐
│ 1    │ Video Codec (H264)           │
│ 2    │ Meta Encoding (MISB)         │
│ 3    │ Meta Container (SEI)         │
│ 4    │ Muxing (TS)                  │
│ 5    │ Sink Type (FILE)             │
│ 6    │ Performance monitoring (OFF) │
│ 7    │ Run                          │
│ 8    │ Back                         │
│ 9    │ Exit                         │
└──────┴──────────────────────────────┘

You can configure the demo as follows.

1 Video Codec Options:

  • H264
  • JPEG
  • AV1

2 Metadata Encoding:

  • MISB
  • Raw

3 Metadata Container:

  • SEI → Only works with H264
  • OBU → Only works with AV1 and MP4 Mux
  • TS → Only with TS Mux

4 Muxing:

  • MP4
  • TS

5 Sink Type:

  • File
  • RTSP → Does NOT support the AV1 codec or OBU metadata
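The compatibility rules above can be summarized in a small validity check. The helper below is hypothetical (not part of RDS); it simply encodes the constraints listed for each option so an invalid demo configuration can be caught before running:

```python
def is_valid_config(codec: str, container: str, mux: str, sink: str) -> bool:
    """Hypothetical helper encoding the demo's compatibility rules."""
    if container == "SEI" and codec != "H264":
        return False                  # SEI only works with H264
    if container == "OBU" and (codec != "AV1" or mux != "MP4"):
        return False                  # OBU needs AV1 and MP4 muxing
    if container == "TS" and mux != "TS":
        return False                  # TS container only with TS mux
    if sink == "RTSP" and (codec == "AV1" or container == "OBU"):
        return False                  # RTSP supports neither AV1 nor OBU
    return True


# The demo's default configuration (H264 / SEI / TS / FILE) is valid:
print(is_valid_config("H264", "SEI", "TS", "FILE"))
```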

The data being injected is displayed in the terminal.

2025-07-03 09:04:33,702 - toolkit-demo - plugin.py     223 -    INFO: 
Precision Time Stamp: _________________ Jul. 03, 2025. 09:04:33.697
Sensor True Altitude: _________________ 999.966430
UAS Datalink LS Version Number: _______ 19
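The Precision Time Stamp shown above is, per the MISB standard, the number of microseconds elapsed since the Unix epoch (1970-01-01 UTC), carried as an 8-byte unsigned big-endian integer. A minimal sketch of that encoding, using integer arithmetic to avoid float rounding at microsecond precision:

```python
import struct
from datetime import datetime, timedelta, timezone

_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)


def encode_precision_time_stamp(dt: datetime) -> bytes:
    """Microseconds since the Unix epoch as an 8-byte big-endian integer."""
    micros = (dt - _EPOCH) // timedelta(microseconds=1)
    return struct.pack(">Q", micros)


def decode_precision_time_stamp(data: bytes) -> datetime:
    micros = struct.unpack(">Q", data)[0]
    return _EPOCH + timedelta(microseconds=micros)


stamp = encode_precision_time_stamp(
    datetime(2025, 7, 3, 9, 4, 33, 697000, tzinfo=timezone.utc))
print(stamp.hex())
```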


Info
When the demo starts, the GStreamer pipeline used by the demo will be shown in the console. You can use it as a reference to run your own version.


Info
To extract the metadata once the demo starts, use the GStreamer pipeline displayed after the following message: In order to see the inserted metadata, run the following pipeline


Using the pipeline provided at the beginning, you can extract the data. For example, the output would look like the following:

0:00:01.692757039  9307 0xaaab00b30180 MEMDUMP           seiextract gstseiextract.c:299:gst_sei_extract_extract_h264_data:<seiextract0> ---------------------------------------------------------------------------
0:00:01.692790192  9307 0xaaab00b30180 MEMDUMP           seiextract gstseiextract.c:299:gst_sei_extract_extract_h264_data:<seiextract0> The extracted data is: 
0:00:01.692817200  9307 0xaaab00b30180 MEMDUMP           seiextract gstseiextract.c:299:gst_sei_extract_extract_h264_data:<seiextract0> 00000000: 06 0e 2b 34 02 0b 01 01 0e 01 03 01 01 00 00 00  ..+4............
0:00:01.692839313  9307 0xaaab00b30180 MEMDUMP           seiextract gstseiextract.c:299:gst_sei_extract_extract_h264_data:<seiextract0> 00000010: 15 02 08 00 06 39 07 f9 a2 3f 10 0f 02 18 71 41  .....9...?....qA
0:00:01.692857105  9307 0xaaab00b30180 MEMDUMP           seiextract gstseiextract.c:299:gst_sei_extract_extract_h264_data:<seiextract0> 00000020: 01 13 01 02 99 40                                .....@          
0:00:01.692871025  9307 0xaaab00b30180 MEMDUMP           seiextract gstseiextract.c:299:gst_sei_extract_extract_h264_data:<seiextract0> ---------------------------------------------------------------------------
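The hex dump above is a MISB ST 0601 KLV local set: a 16-byte universal key, a BER length byte, then tag/length/value triplets (tag 2 = Precision Time Stamp, tag 15 = Sensor True Altitude, tag 65 = UAS Datalink LS Version, tag 1 = checksum). The sketch below parses this exact dump and verifies it; it handles only the short (1-byte) tag/length form for brevity, and the altitude mapping and checksum follow the ST 0601 definitions:

```python
# The exact bytes from the memory dump above
PACKET = bytes.fromhex(
    "060e2b34020b01010e01030101000000"  # 16-byte universal key
    "15"                                # BER length: 0x15 = 21 payload bytes
    "02 08 00 06 39 07 f9 a2 3f 10"     # tag 2: Precision Time Stamp
    "0f 02 18 71"                       # tag 15: Sensor True Altitude
    "41 01 13"                          # tag 65: UAS Datalink LS Version
    "01 02 99 40"                       # tag 1: checksum
)


def parse_klv(packet: bytes):
    """Parse a short-form ST 0601 local set (1-byte tags and lengths)."""
    key, length = packet[:16], packet[16]
    payload = packet[17:17 + length]
    items, i = {}, 0
    while i < len(payload):
        tag, ln = payload[i], payload[i + 1]
        items[tag] = payload[i + 2:i + 2 + ln]
        i += 2 + ln
    return key, items


def st0601_checksum(packet: bytes) -> int:
    """16-bit running sum over every byte except the final 2 checksum bytes."""
    bcc = 0
    for i, byte in enumerate(packet[:-2]):
        bcc += byte << (8 * ((i + 1) % 2))
    return bcc & 0xFFFF


key, items = parse_klv(PACKET)
# Tag 15 maps the 2-byte range [0, 65535] onto [-900, 19000] meters
altitude = int.from_bytes(items[15], "big") / 65535 * 19900 - 900
version = items[65][0]
print(f"altitude={altitude:.6f} version={version}")  # matches the demo log
print(st0601_checksum(PACKET) == int.from_bytes(items[1], "big"))
```

Note that the decoded altitude (999.966430) and version (19) match the values printed by the demo.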

Build your own Security, Video Intelligence and Metadata Integration system

Documentation

1. Start with RR-Media API

Now that you have seen RDS in action, it's time to build your application. We recommend starting with the RR-MEDIA API, which allows you to quickly build your own proof of concept (POC) with an easy-to-use Python API.


For this, we will need the following RR-Media modules:

  1. gst.source.test: used to generate a videotestsrc.
  2. jetson.sink.rtsp: used to stream via RTSP.
  3. jetson.sink.file: used to save a video file.

We will use the ModuleGraph module to build the following graph:

Sample graph for Security, Video Intelligence & Metadata Integration

Your Python script should look like this:

from rrmedia.media.core.base import (MetadataContainer,
                                     MuxType,
                                     VideoCodec)
from rrmedia.media.core.factory import ModuleFactory
from rrmedia.media.core.graph import ModuleGraph

# Create graph
graph = ModuleGraph()

# Create the video test source (select your desired pattern)
graph.add(ModuleFactory.create(
    "gst.source.test",
    pattern=18, name="src"))

# Define the type of metadata, codec and mux
codec = VideoCodec.H264
meta_container = MetadataContainer.SEI
mux = MuxType.MP4

# File sink
graph.add(ModuleFactory.create(
    "jetson.sink.file",
    location="test.mp4",
    codec=codec,
    meta_container=meta_container,
    mux=mux,
    name="sink"))

# Or, instead of the file sink, use an RTSP sink with the desired
# port and mapping:
# graph.add(ModuleFactory.create(
#     "jetson.sink.rtsp",
#     port=12345,
#     mapping="stream",
#     codec=codec,
#     meta_container=meta_container,
#     mux=mux,
#     name="sink"))

# Connect modules
graph.connect("src", "sink")

# Print pipeline
print("Graph pipeline: %s" % graph.dump_launch())

# Start playback
graph.play()

# Start loop (this is a blocking function)
graph.loop()

After playing the pipeline you can inject raw or MISB metadata as follows. Note that datetime comes from Python's standard library, while LibMisb, JsonFormatter, metadata_item, and Metadata come from the LibMISB Python bindings:

# a) Inject raw metadata
timestamp = datetime.now().isoformat()
graph.set_property("sink", "metadata", timestamp)

# b) Inject metadata with LibMISB
misb_encoder = LibMisb()
misb_encoder.set_formatter(JsonFormatter())

# Generate complex metadata
key = "060E2B34020B01010E01030101000000"
altitude = 1000
tags = []

# Tag 15: Sensor True Altitude
meta_item = metadata_item()
meta_item.tag = "15"
meta_item.value = str(altitude)
tags.append(meta_item)

# Tag 2: Precision Time Stamp
meta_item = metadata_item()
now = datetime.now()
formatted = now.strftime("%b. %d, %Y. %H:%M:%S") + \
            f".{now.microsecond // 1000:03d}"
meta_item.tag = "2"
meta_item.value = formatted
tags.append(meta_item)

meta = Metadata()
meta.set_key(key)
meta.set_items(tags)

data, status = misb_encoder.encode(meta)

# Inject LibMISB metadata
graph.set_property("sink", "metadata-binary", data)

When you run this script, you should see the injected data. When using RTSP you can inspect the stream live; with a file sink, you can extract the data once the video file has been generated.


Info
When the script is run, a GStreamer pipeline will be printed in the console. You can run this pipeline using gst-launch to produce the same results the application produced.


2. Build or Customize your own pipeline

RR-Media is designed for ease of use and quick testing; however, in certain situations more control is needed and you have to go deeper into the application. In that scenario you have two options:

1. Extend RR-Media to fulfill your needs

2. Build your own GStreamer pipeline.

In this section, we will cover (2). If you want to know how to extend RR-Media, go to RR-MEDIA API.

A good starting point is the GStreamer pipeline obtained while running the RR-Media application. You can use it as your base and start customizing according to your needs.

1. Select your input

When working with GStreamer, it's important to define the type of input you're using—whether it's an image, video file, or camera. Here are some examples:

For example, in the case of an MP4 video called <MP4_FILE>:

INPUT="filesrc location=<MP4_FILE> ! qtdemux ! h264parse ! decodebin ! queue "

For a camera using NVArgus with a specific sensor ID <Camera ID>:

INPUT="nvarguscamerasrc sensor-id=<Camera ID> ! nvvidconv ! queue "

For videotestsrc, use the following with your desired pattern <PATTERN>:

INPUT="videotestsrc pattern=<PATTERN> ! nvvidconv ! queue "


2. Metadata Setup

After defining the input, you can inject your metadata, for example by using the seiinject element provided by GstSEIMetadata:

METADATA="nvv4l2h264enc ! h264parse ! seiinject metadata=\"Hello World\""

3. Output Options

You can choose how you want the output to be handled—whether you want to stream, or save the video.

To stream using RTSP with the desired <PORT>

OUTPUT="h264parse ! video/x-h264, stream-format=avc, mapping=stream1 ! rtspsink service=<PORT> async-handling=true"

To save the output locally (when using gst-launch directly, add the -e option so the MP4 file is finalized on interruption):

OUTPUT="qtmux ! filesink location=Test.mp4"

4. Final Pipeline

Finally, you can connect all components using gst-launch or GStreamer Daemon (GSTD):

gstd &

gstd-client pipeline_create p1 $INPUT ! $METADATA ! $OUTPUT
gstd-client pipeline_play p1

Then extract the data from the file generated or the RTSP Stream

GST_DEBUG=*seiextract*:MEMDUMP gst-launch-1.0 filesrc location=Test.mp4 ! queue ! qtdemux ! video/x-h264 ! seiextract !  h264parse ! avdec_h264 !  queue ! fakesink
GST_DEBUG=*sei*:MEMDUMP gst-launch-1.0 -v -e rtspsrc location=rtsp://TARGET_IP:5555/stream1 ! rtph264depay ! seiextract ! queue ! decodebin ! queue ! fakesink sync=true

Extend it Further

You can also carry other types of metadata: use In-band metadata for Transport Streams, GstObuMetadata for AV1 video, and LibMISB for MISB-compliant KLV encoding.